Relations between Renyi Distance and Fisher Information

Authors

  • M. Abbasnejad
  • N. R. Arghami
Abstract:

In this paper, we first show that the Renyi distance between any member of a parametric family and its perturbations is proportional to its Fisher information. We then prove some relations between the Renyi distance of two distributions and the Fisher information of their exponentially twisted family of densities. Finally, we show that the partial ordering of families induced by the Renyi distance is the same as that induced by Fisher information.
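The first claim can be illustrated numerically. For a normal location family N(θ, σ²), the Fisher information is I(θ) = 1/σ², and for a small perturbation ε the Rényi divergence of order α satisfies D_α(p_θ ‖ p_{θ+ε}) ≈ (α/2) ε² I(θ). The following is a minimal sketch, not taken from the paper; the Gaussian family, grid, and parameter values are illustrative assumptions:

```python
import numpy as np

# Rényi divergence of order alpha: D_a(P||Q) = ln(∫ p^a q^(1-a) dx) / (a - 1)
def renyi_divergence(p, q, alpha, dx):
    return np.log(np.sum(p ** alpha * q ** (1 - alpha)) * dx) / (alpha - 1)

# Normal location family N(theta, sigma^2); its Fisher information is I = 1/sigma^2
sigma, theta, eps, alpha = 1.0, 0.0, 0.01, 0.7
x = np.linspace(-10.0, 10.0, 20001)
dx = x[1] - x[0]
density = lambda mu: np.exp(-(x - mu) ** 2 / (2 * sigma ** 2)) / (sigma * np.sqrt(2 * np.pi))

d = renyi_divergence(density(theta), density(theta + eps), alpha, dx)
fisher = 1.0 / sigma ** 2

# D_alpha(p_theta || p_{theta + eps}) is proportional to the Fisher information:
print(d / eps ** 2)        # ≈ 0.35
print(alpha / 2 * fisher)  # 0.35
```

For equal-variance Gaussians the proportionality is in fact exact, which makes this family a convenient sanity check for the small-perturbation limit.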


Similar Articles

Fisher information distance: a geometrical reading?

This paper takes a strongly geometrical approach to the Fisher distance, a measure of dissimilarity between two probability distribution functions. The Fisher distance, like other divergence measures, is also used in many applications to establish a proper data average. The main purpose is to widen the range of possible interpretations and relations of the Fisher distance and its a...


Cramér-Rao and moment-entropy inequalities for Renyi entropy and generalized Fisher information

The moment-entropy inequality shows that a continuous random variable with given second moment and maximal Shannon entropy must be Gaussian. Stam’s inequality shows that a continuous random variable with given Fisher information and minimal Shannon entropy must also be Gaussian. The Cramér-Rao inequality is a direct consequence of these two inequalities. In this paper the inequalities above are ...


Extreme Fisher Information, Non-Equilibrium Thermodynamics and Reciprocity Relations

In employing MaxEnt, a crucial role is assigned to the reciprocity relations that relate the quantifier to be extremized (Shannon’s entropy S), the Lagrange multipliers that arise during the variational process, and the expectation values that constitute the a priori input information. We review here just how these ingredients relate to each other when the information quantifier S is replaced b...


Potential Statistical Evidence in Experiments and Renyi Information

Recently, Habibi et al. (2006) defined a pre-experimental criterion for the potential strength of evidence provided by an experiment, based on the Kullback-Leibler distance. In this paper, we investigate the potential statistical evidence in an experiment in terms of the Renyi distance and compare the potential statistical evidence in lower (upper) record values with that in the same number of ii...


Baryonic Tully-Fisher Relations

I describe the disk mass–rotation velocity relation which underpins the familiar luminosity–linewidth relation. Continuity of this relation favors nearly maximal stellar mass-to-light ratios. This contradicts the low mass-to-light ratios implied by the lack of surface brightness dependence in the same relation. 1. Searching for the Physical Basis of the Tully-Fisher Relation The Tully-Fisher (T...


Shannon Entropy, Renyi Entropy, and Information

This memo contains proofs that the Shannon entropy is the limiting case of both the Renyi entropy and the Tsallis entropy, or information. These results are also confirmed experimentally. We conclude with some general observations on the utility of entropy measures. A brief summary of the origins of the concept of physical entropy are provided in an appendix.
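The limiting relation described in this excerpt can be checked directly: the Rényi entropy H_α = ln(Σᵢ pᵢ^α)/(1 − α) and the Tsallis entropy S_q = (1 − Σᵢ pᵢ^q)/(q − 1) both tend to the Shannon entropy −Σᵢ pᵢ ln pᵢ as the order tends to 1. A small sketch, with an arbitrary illustrative distribution:

```python
import numpy as np

p = np.array([0.5, 0.25, 0.125, 0.125])  # an arbitrary discrete distribution

def shannon(p):
    return -np.sum(p * np.log(p))  # Shannon entropy, in nats

def renyi(p, alpha):
    return np.log(np.sum(p ** alpha)) / (1 - alpha)

def tsallis(p, q):
    return (1 - np.sum(p ** q)) / (q - 1)

h = shannon(p)
for order in (0.99, 1.01):
    # Both generalized entropies converge to the Shannon entropy as the order -> 1
    print(order, renyi(p, order) - h, tsallis(p, order) - h)
```

Both deviations shrink linearly as the order approaches 1, consistent with the limiting-case result the memo proves.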




Journal details

Volume 5

Pages 25-37

Publication date: 2006-11
